1.
International Journal of High Performance Computing Applications ; 37(1):46478.0, 2023.
Article in English | Scopus | ID: covidwho-2239171

ABSTRACT

This paper describes an integrated, data-driven operational pipeline based on national agent-based models to support federal and state-level pandemic planning and response. The pipeline consists of (i) an automatic semantic-aware scheduling method that coordinates jobs across two separate high performance computing systems; (ii) a data pipeline to collect, integrate and organize national and county-level disaggregated data for initialization and post-simulation analysis; (iii) a digital twin of national social contact networks made up of 288 million individuals and 12.6 billion time-varying interactions covering the US states and DC; (iv) an extension of a parallel agent-based simulation model to study epidemic dynamics and associated interventions. This pipeline can run 400 replicates of national runs in less than 33 h, and reduces the need for human intervention, resulting in faster turnaround times and higher reliability and accuracy of the results. Scientifically, the work has led to significant advances in real-time epidemic sciences. © The Author(s) 2022.

2.
Journal of Experimental Biology and Agricultural Sciences ; 10(6):1215-1221, 2022.
Article in English | Scopus | ID: covidwho-2217792

ABSTRACT

The severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) Omicron variant has been the sole variant circulating for quite some time. Subvariants BA.1, BA.2, BA.3, BA.4, and BA.5 of Omicron emerged over time through mutation, with BA.1 responsible for the most severe global pandemic wave between December 2021 and January 2022. Other Omicron subvariants such as BQ.1, BQ.1.1, BA.4.6, BF.7, BA.2.75.2, and XBB.1 appeared recently and could cause a new wave of increased cases amid the ongoing COVID-19 pandemic. There is evidence that certain Omicron subvariants have increased transmissibility, extra spike mutations, and the ability to overcome the protective effects of COVID-19 neutralizing antibodies through immunological evasion. In recent months, the Omicron BF.7 subvariant has been in the news due to its spread in China and a small number of other countries, raising concerns about a possible rebound in COVID-19 cases. More recently, the Omicron XBB.1.5 subvariant has captured international attention due to an increase in cases in the United States. As a highly transmissible sublineage of Omicron BA.5 with a shorter incubation time and the potential to reinfect or infect immune populations, BF.7 has a stronger infection ability. It appears that the regional immunological landscape is shaped by the size and timing of previous Omicron waves, as well as by COVID-19 vaccination coverage, which in turn determines whether the increased immune escape of the BF.7 and XBB.1.5 subvariants is sufficient to drive new infection waves. Expanding our understanding of transmission and of the efficacy of vaccines, immunotherapeutics, and antiviral drugs against newly emerging Omicron subvariants and lineages, bolstering genomic facilities for tracking their spread, maintaining constant vigilance, and shedding more light on their evolution and mutational events would help in the development of effective mitigation strategies.
Importantly, reducing the occurrence of mutations and recombination in the virus can be aided by bolstering the One Health approach and emphasizing its significance in combating the zoonosis and reverse zoonosis linked with COVID-19. This article provides a brief overview of the Omicron variant and its recently emerging lineages and subvariants, with a special focus on BF.7 and XBB.1.5 as more infectious and highly transmissible variations that may once again threaten a sharp increase in COVID-19 cases globally amid the ongoing pandemic, along with presenting salient mitigation measures. © 2022, Editorial board of Journal of Experimental Biology and Agricultural Sciences. All rights reserved.

3.
Infection prevention in practice ; 2022.
Article in English | EuropePMC | ID: covidwho-2073691

ABSTRACT

Background: The COVID-19 pandemic has substantially affected antibiotic stewardship activities in most hospitals in India. Aims: We conducted an antibiotic point-prevalence survey (PPS) immediately after the decline of a major COVID-19 wave at a dedicated COVID-19 hospital. By doing so we aimed to identify antibiotic prescription patterns, identify factors influencing the choice of antibiotics, and identify or develop strategies to improve the antibiotic stewardship program in such setups. Methods: The PPS was single-centred, cross-sectional, and retrospective in nature. Patients admitted to various wards and intensive care units (ICUs) between September 2021 and October 2021 were included. Results: Of the 460 included patients, 192 were prescribed antibiotics. Among these 192 patients, those admitted to ICUs had the highest number of antibiotics prescribed (2.09 ± 0.92). Only a minor fraction (7.92%) of antibiotic prescriptions were based on culture reports. Most antibiotics were prescribed empirically by the parenteral route. The most common group of antibiotics prescribed was third-generation cephalosporins. Carbapenems were the most common designated antibiotics prescribed. A large proportion of patients (22.40%) were prescribed double anaerobic cover. Conclusion: The strategies we identified to improve the antibiotic stewardship program at our institute included reviving the practice of obtaining cultures to guide antibiotic prescription, improving surgical prophylaxis guidelines, training resident doctors to categorize antibiotic prescriptions appropriately, closely monitoring prescriptions providing double anaerobic cover, and improving the electronic medical record system for prescription auditing.

4.
28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, KDD 2022 ; : 4675-4683, 2022.
Article in English | Scopus | ID: covidwho-2020404

ABSTRACT

We study allocation of COVID-19 vaccines to individuals based on the structural properties of their underlying social contact network. Using a realistic representation of a social contact network for the Commonwealth of Virginia, we study how a limited number of vaccine doses can be strategically distributed to individuals to reduce the overall burden of the pandemic. We show that allocation of vaccines based on individuals' degree (number of social contacts) and total social proximity time is significantly more effective than the usually used age-based allocation strategy in reducing the number of infections, hospitalizations and deaths. The overall strategy is robust even: (i) if the social contacts are not estimated correctly; (ii) if the vaccine efficacy is lower than expected or only a single dose is given; (iii) if there is a delay in vaccine production and deployment; and (iv) whether or not non-pharmaceutical interventions continue as vaccines are deployed. For reasons of implementability, we have used degree, which is a simple structural measure that can be easily estimated using several methods, including the digital technology available today. These results are significant, especially for resource-poor countries, where vaccines are less available, have lower efficacy, and are more slowly distributed. © 2022 Owner/Author.
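The degree-based allocation strategy the abstract describes can be sketched in a few lines. The contact network below is a hypothetical toy example (made-up individuals and edges), standing in for the paper's Virginia social contact network:

```python
# Toy contact network as an adjacency list (hypothetical data, not
# the authors' Virginia network).
contacts = {
    "a": ["b", "c", "d", "e"],
    "b": ["a", "c"],
    "c": ["a", "b", "d"],
    "d": ["a", "c"],
    "e": ["a"],
}

def allocate_by_degree(contacts, doses):
    # Rank individuals by degree (number of social contacts) and
    # give the limited doses to the highest-degree individuals.
    ranked = sorted(contacts, key=lambda p: len(contacts[p]), reverse=True)
    return ranked[:doses]

print(allocate_by_degree(contacts, 2))  # → ['a', 'c']
```

With two doses, the hub "a" (4 contacts) and "c" (3 contacts) are vaccinated first; an age-based strategy would ignore this structure entirely, which is the comparison the paper makes.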

5.
Ain - Shams Journal of Anesthesiology ; 14(1), 2022.
Article in English | ProQuest Central | ID: covidwho-1879271

ABSTRACT

Background: Acute hypoxemic respiratory failure is the most common complication of COVID-19 infection. Newer ways of delivering oxygen therapy were explored during this pandemic. High-flow nasal oxygenation (HFNO) emerged as a novel technique for oxygenation and prevented the need for invasive mechanical ventilation during hypoxia among COVID-19 patients. However, high-flow oxygen dries the nasal mucosa and can lead to skin disruption. We present this case because, to our knowledge, this complication has not been reported elsewhere. Case presentation: We present the case of a 62-year-old male who was on HFNO for a long time as part of treatment for COVID-19 and developed ulceration of the nasal septum. The patient belonged to the geriatric age group and had diabetes mellitus. Close monitoring by ICU (intensive care unit) staff was a big problem during this pandemic. Daily physical assessment, good nutrition, and daily dressing with plastic surgery consultation helped treat our patient. Conclusions: Geriatric patients with other co-morbidities are vulnerable to mucosal injury. Even in the COVID-19 era, everyday general physical surveillance is vital in such patients to prevent these complications. During this pandemic, close monitoring of patients suffered due to the scarcity of ICU staff. In spite of that, daily physical surveillance and good supplemental nutrition must be ensured, especially in geriatric patients.

6.
9th International Workshop on Engineering Multi-Agent Systems, EMAS 2021 ; 13190 LNAI:1-21, 2022.
Article in English | Scopus | ID: covidwho-1777659

ABSTRACT

Agent-based simulation is increasingly being used to model social phenomena involving large numbers of agents. However, existing agent-based simulation platforms severely limit the kinds of social phenomena that can be modeled, as they do not support large-scale simulations involving agents with complex behaviors. In this paper, we present a scalable agent-based simulation framework that supports modeling of complex social phenomena. The framework integrates a new simulation platform that exploits distributed computer architectures with an extension of a multi-agent programming technology that allows development of complex deliberative agents. To show the scalability of our framework, we briefly describe its application to the development of a model of the spread of COVID-19 involving complex deliberative agents in the US state of Virginia. © 2022, Springer Nature Switzerland AG.

7.
Open Forum Infectious Diseases ; 8(SUPPL 1):S104-S106, 2021.
Article in English | EMBASE | ID: covidwho-1746765

ABSTRACT

Background. The COVID-19 pandemic was associated with a significant (28%) reduction of methicillin-resistant Staphylococcus aureus (MRSA) acquisition at UVA Hospital (P=0.016). This "natural experiment" allowed us to analyze 3 key mechanisms by which the pandemic may have influenced nosocomial transmission: 1) enhanced infection control measures (i.e., barrier precautions and hand hygiene), 2) patient-level risk factors, and 3) networks of healthcare personnel (HCP)-mediated contacts. Hospital MRSA acquisition was defined as a new clinical or surveillance positive in patients with prior unknown or negative MRSA status occurring >72h after admission. 10-month time periods pre- (5/6/2019 to 2/23/2020) and post-COVID-19 (5/4/2020 to 2/28/2021) were chosen to mitigate the effects of seasonality. A 6-week wash-in period was utilized coinciding with the onset of several major hospital-wide infection control measures (opening of 2 special pathogen units with universal contact/airborne precautions on 4/1/20 and 5/1/20, universal mask 4/10/20 and eye protection 4/20/20 policies instituted along with staff education efforts, including the importance of standard precautions). Box and whisker plots depict quartile ranges, median (dotted line), and mean values. Mean MRSA acquisition rates pre- (0.92 events per 1,000 patient days) significantly declined post-COVID-19 (to 0.66; P=0.016). Independent-samples t tests were used (2-tailed) for statistical comparisons except for variables without a normal distribution (Shorr Scores), for which a Mann-Whitney U test was used. Methods. Census-adjusted hospital-acquired MRSA acquisition events were analyzed over 10 months pre- (5/6/2019 to 2/23/2020) and post-COVID-19 (5/4/2020 to 2/28/2021), with a 6-week wash-in period coinciding with hospital-wide intensification of infection control measures (e.g., universal masking). HCP hand hygiene compliance rates were examined to reflect adherence to infection control practices.
To examine impacts of non-infection-control measures on MRSA transmission, we analyzed pre/post-COVID-19 differences in individual risk profiles for MRSA acquisition as well as a broad suite of properties of the hospital social network, using person-location and person-person interactions inferred from the electronic medical record. Figure 2. Social Network Construction. We constructed a contact network of hospitalized patients and staff at University of Virginia Hospital to analyze the properties of both person-location and person-person networks and their changes pre- and post-COVID-19. Colocation data (inferred from shared patient rooms and healthcare personnel (HCP)-patient interactions recorded in the electronic health record, e.g., medication administration) were used to construct contact networks, with nodes representing patients and HCP, and edges representing contacts. The above schematic shows how the temporal networks are inferred. In the figure, circles represent patients and the small filled squares represent HCP, while the larger rectangles represent patient rooms. The first room is a shared room with two patients. At each time step, co-location is inferred from the EMR data, which specifies interactions between HCP and patients. This can be represented as the temporal network (t) at the bottom. Results. Hand hygiene compliance significantly improved post-COVID-19, in parallel with other infection control measures. Patient Shorr Scores (an index of individual MRSA risk) were statistically similar pre-/post-COVID-19. Analysis of various network properties demonstrated no trends to suggest a reduced outbreak threshold post-COVID-19. Figure 3.
Hand Hygiene Compliance Rates. Analysis of hospital-wide hand hygiene auditing data (anonymous auditors deployed to various units across UVA Hospital, with an average of 1,710 observations per month (range 340 - 7,187)) demonstrated a statistically significant (6%) improvement in average monthly hand hygiene compliance (86.9% pre- versus 93.1% post-COVID-19; P=0.008). Figure 4. Individual MRSA Risk Factors. We calculated the Shorr Score (a validated tool to estimate individual risk for MRSA carriage in hospitalized patients; Shorr et al. Arch Intern Med. 2008;168(20):2205-10) for patients using data from the electronic health record to test the hypothesis that individual risk factors in aggregate did not change significantly in the post-COVID-19 period to explain changes in MRSA acquisition. Values for this score ranged from 0 to 10 with the following criteria: recent hospitalization (4), nursing home residence (3), hemodialysis (2), ICU admission (1). Pictured are frequency distributions of Shorr scores in the pre-COVID-19 and post-COVID-19 periods. The Mann-Whitney effect size (E), 0.53 (P=0.51), indicated that pre- and post-COVID-19 distributions were very similar. We analyzed three major types of network properties for this analysis: (1) Node properties of the pre- and post-COVID-19 networks, which consisted of all the edges in the pre- and post-COVID-19 periods, respectively. We considered a number of standard properties used in social network analysis to quantify opportunities for patient-patient transmission: degree centrality (links held by each node), betweenness centrality (times each node acts as the shortest 'bridge' between two other nodes), closeness centrality (how close each node is to other nodes in the network), eigenvector centrality (a node's relative influence on the network), and clustering coefficient (degree to which nodes cluster together) in the first five panels (left to right, top to bottom) (Newman, Networks: An Introduction, 2010).
Each panel shows the frequency distributions of these properties. These properties generally did not have a normal distribution, and therefore we used a Mann-Whitney U test on random subsets of nodes in these networks to compare pre- and post-COVID properties. The mean effect size (E) and P-values are shown for each metric in parentheses. We concluded that all of these pre- versus post-COVID-19 network properties were statistically similar. (2) Properties of the ego networks (networks induced by each node and its 'one-hop' neighbors). We considered density (average number of neighbors for each node; higher density generally favors a lower outbreak threshold) and degree centrality (number of links held by each node) of ego networks (middle right and bottom left panels). The mean effect size and P-values using the Mann-Whitney test are shown in parentheses; there were no statistically significant differences in these properties between the pre- and post-COVID networks. (3) Aggregate properties of the weekly networks, consisting of all the interactions within a week. We considered modularity (a measure of how the community structure differs from a random network; higher modularity means a stronger community structure and lower likelihood of transmission) and density (average number of neighbors per node; higher density generally favors a lower outbreak threshold) of the weekly networks (bottom middle and bottom right panels). The modularity of the post-COVID weekly networks was slightly lower (i.e., a weaker community structure, with the network more well mixed), while density was slightly higher; these differences were statistically significant, with the caveat that these are relatively small datasets (about 40 weeks). These differences (higher density and better connectivity) both increase the risk of transmission in the post-COVID networks.
In summary, the post-COVID networks either have similar properties to the pre-COVID networks, or had changes which are unlikely to have played a role in reducing MRSA transmission. Conclusion. A significant reduction in post-COVID-19 MRSA transmission may have been an unintended positive effect of enhanced infection control measures, particularly hand hygiene and increased mask use. A modest (11.6%) post-COVID-19 reduction in surveillance testing may have also played a role. Despite pandemic-related cohorting and census fluctuations, most network properties were not significantly different post-COVID-19, except for aggregate density and modularity, which varied in a direction that instead favored transmission; therefore, HCP-based networks did not play a significant role in reducing MRSA transmission. Multivariate modeling to isolate the relative contributions of these factors is underway. Figure 6. Surveillance Testing and Clinical Culturing. Post-COVID-19, there was a modest (11.6%) but statistically significant reduction in surveillance PCR testing (42.4 mean tests per 1,000 patient days pre- versus 37.5 post-COVID-19; P<0.002). There was not a statistically significant difference in rates of clinical cultures sent (2.48 cultures per 1,000 patient days pre- versus 2.23 post-COVID-19; P=0.288).
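A few of the network properties named in this abstract (degree centrality, clustering coefficient, network density) can be illustrated with a minimal pure-Python sketch; the adjacency data below is hypothetical and only stands in for the EMR-inferred hospital contact network:

```python
from itertools import combinations

# Hypothetical patient/HCP contact network as symmetric adjacency sets
# (a stand-in for the network described in the abstract).
net = {
    1: {2, 3}, 2: {1, 3}, 3: {1, 2, 4}, 4: {3, 5}, 5: {4},
}

def degree_centrality(net):
    # Fraction of the other nodes each node is directly linked to.
    n = len(net)
    return {v: len(nbrs) / (n - 1) for v, nbrs in net.items()}

def clustering(net, v):
    # Share of a node's neighbor pairs that are themselves linked.
    nbrs = net[v]
    if len(nbrs) < 2:
        return 0.0
    links = sum(1 for a, b in combinations(nbrs, 2) if b in net[a])
    return links / (len(nbrs) * (len(nbrs) - 1) / 2)

def density(net):
    # Share of all possible edges that are present in the network.
    n = len(net)
    edges = sum(len(nbrs) for nbrs in net.values()) / 2
    return 2 * edges / (n * (n - 1))
```

Comparing the distributions of such metrics between two periods (here, pre- vs post-COVID networks) is what the abstract's Mann-Whitney U tests operate on.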

8.
2021 IEEE International Conference on Big Data, Big Data 2021 ; : 1566-1574, 2021.
Article in English | Scopus | ID: covidwho-1730887

ABSTRACT

We study the role of vaccine acceptance in controlling the spread of COVID-19 in the US using AI-driven agent-based models. Our study uses a 288 million node social contact network spanning all 50 US states plus Washington DC, comprising 3,300 counties, with 12.59 billion daily interactions. The highly resolved agent-based models use realistic information about disease progression, vaccine uptake, production schedules, acceptance trends, prevalence, and social distancing guidelines. Developing a national model at this resolution that is driven by realistic data requires complex scalable workflow, model calibration, simulation, and analytics components. Our workflow optimizes the total execution time and helps in improving overall human productivity. This work develops a pipeline that can execute US-scale models and associated workflows that typically present significant big data challenges. Our results show that, when compared to faster and accelerating vaccinations, slower vaccination rates due to vaccine hesitancy cause averted infections to drop from 6.7M to 4.5M, and averted total deaths to drop from 39.4K to 28.2K nationwide. This occurs despite the fact that the final vaccine coverage is the same in both scenarios. Improving vaccine acceptance by 10% in all states increases averted infections from 4.5M to 4.7M (a 4.4% improvement) and averted total deaths from 28.2K to 29.9K (a 6% increase) nationwide. The analysis also reveals interesting spatio-temporal differences in COVID-19 dynamics as a result of vaccine acceptance. To our knowledge, this is the first national-scale analysis of the effect of vaccine acceptance on the spread of COVID-19 using detailed and realistic agent-based models. © 2021 IEEE.

9.
22nd International Workshop on Multi-Agent-Based Simulation, MABS 2021 ; 13128 LNAI:99-112, 2022.
Article in English | Scopus | ID: covidwho-1680637

ABSTRACT

Modelling social phenomena in large-scale agent-based simulations has long been a challenge due to the computational cost of incorporating agents whose behaviors are determined by reasoning about their internal attitudes and external factors. However, COVID-19 has brought the urgency of doing this to the fore, as, in the absence of viable pharmaceutical interventions, the progression of the pandemic has primarily been driven by behaviors and behavioral interventions. In this paper, we address this problem by developing a large-scale data-driven agent-based simulation model where individual agents reason about their beliefs, objectives, trust in government, and the norms imposed by the government. These internal and external attitudes are based on actual data concerning daily activities of individuals, their political orientation, and norms being enforced in the US state of Virginia. Our model is calibrated using mobility and COVID-19 case data. We show the utility of our model by quantifying the benefits of the various behavioral interventions through counterfactual runs of our calibrated simulation. © 2022, Springer Nature Switzerland AG.

10.
British Journal of Surgery ; 108:1, 2021.
Article in English | Web of Science | ID: covidwho-1539248
11.
British Journal of Surgery ; 108:1, 2021.
Article in English | Web of Science | ID: covidwho-1539247
12.
35th IEEE International Parallel and Distributed Processing Symposium, IPDPS 2021 ; : 639-650, 2021.
Article in English | Scopus | ID: covidwho-1393745

ABSTRACT

The COVID-19 global outbreak represents the most significant epidemic event since the 1918 influenza pandemic. Simulations have played a crucial role in supporting COVID-19 planning and response efforts. Developing scalable workflows to provide policymakers quick responses to important questions pertaining to logistics, resource allocation, epidemic forecasts and intervention analysis remains a challenging computational problem. In this work, we present scalable high performance computing-enabled workflows for COVID-19 pandemic planning and response. The scalability of our methodology allows us to run fine-grained simulations daily, and to generate county-level forecasts and other counterfactual analyses for each of the 50 states (and DC) and 3,140 counties across the USA. Our workflows use a hybrid cloud/cluster system combining local and remote cluster computing facilities, with over 20,000 CPU cores running for 6-9 hours every day to meet this objective. Our state (Virginia), the state hospital network, our university, the DOD, and the CDC use our models to guide their COVID-19 planning and response efforts. We began executing these pipelines on March 25, 2020, and have delivered and briefed weekly updates to these stakeholders for over 30 weeks without interruption. © 2021 IEEE.

13.
2nd International Conference on Intelligent Engineering and Management, ICIEM 2021 ; : 271-276, 2021.
Article in English | Scopus | ID: covidwho-1280228

ABSTRACT

In this article, the authors propose a scheme, BloCoV6, that integrates sixth-generation (6G)-assisted unmanned aerial vehicles (UAVs) and blockchain (BC) to perform mass surveillance of persons in dense areas and implement a trust-based contact-tracing ecosystem in BC. The scheme operates in two phases. In the first phase, based on area density and the number of users, UAV swarms are mounted with thermal imaging sensors that monitor the body temperature of persons. The collected data are sent to ground stations in real time through 6G network services. Once the images are analyzed, the details of potential COVID-19 patients are identified, and their travel and contact records are fetched and stored in BC. In the second phase, the contact-tracing information is validated in BC. The proposed scheme is simulated for smart contract (SC) functionalities, UAV observations, latency, spectral efficiency, and transaction and signing costs. The obtained results indicate the scheme's viability. For example, 6G has a low latency of 330.8 milliseconds (ms), which outperforms the 1200.1 ms of fifth-generation (5G) channels. The observed spectral efficiency of 6G channels is 5-10× higher than 5G, and the average signing and transaction costs are 3.473 seconds (s) and 6.873 s, respectively, which outperforms conventional schemes. © 2021 IEEE.

14.
Acs Environmental Science and Technology Water ; 1(1):8-10, 2021.
Article in English | Web of Science | ID: covidwho-1272823
15.
Annals of the Romanian Society for Cell Biology ; 25(4):11922-11934, 2021.
Article in English | Scopus | ID: covidwho-1227406

ABSTRACT

The COVID-19 pandemic devastated the world in 2020, with more than 86 million cases and 1.8 million (18 lakh) deaths worldwide. Initially considered a respiratory infection, more and more studies point towards coagulopathy as a primary mechanism of organ damage and death caused by this virus. The purpose of this study is to look for raised D-dimer levels in COVID-19 patients and assess whether they correlate with severity and outcome of the disease. The study was carried out in 100 patients with COVID-19. The mean D-dimer level on admission was 1.064 mg/L. The study finds significantly higher mean D-dimer levels (p=0.00) and a significantly higher number of patients with raised (≥0.5 mg/L) D-dimer levels (p=0.00) in the Critical and Severe groups as compared to the Moderate and Mild groups. There were 48 patients with normal D-dimer (<0.5 mg/L) levels, of whom 3 patients died, and 52 patients with raised (≥0.5 mg/L) D-dimer levels, of whom 18 patients died (p=0.01). There was a significantly higher number of patients requiring supplemental oxygen among patients with raised D-dimer levels (p=0.00) in the Moderate group. © 2021, Annals of R.S.C.B. All rights reserved.

16.
Indian Journal of Critical Care Medicine ; 25(SUPPL 1):S7, 2021.
Article in English | EMBASE | ID: covidwho-1200231

ABSTRACT

Introduction: About 20 to 30% of COVID-19 patients admitted to the ICU develop severe ARDS. Tracheal intubation in such patients carries a high risk of complications and mortality.1,2 High-flow nasal oxygen therapy (HFNOT) is an attractive option as it can reduce the requirement for intubation. Objectives: This study aimed to determine the impact of HFNOT on oxygenation as well as HFNOT failure. The primary objective was to determine the change in PaO2/FiO2 ratio from baseline to 1 hour, 6 hours, and 7 days after HFNOT initiation in critically ill COVID-19 patients presenting with acute hypoxemic respiratory failure (AHRF). The secondary objective was to determine the HFNOT failure rate [i.e., the requirement for tracheal intubation or noninvasive ventilation (NIV)]. Materials and methods: After approval from the institutional ethics committee and written informed consent, a prospective observational study was performed over a period of 3 months at the COVID intensive care unit of a government institute-hospital in east India. Adult patients (aged >18 years) with confirmed COVID-positive status (SARS-CoV-2 detected in nasopharyngeal swab by real-time reverse transcription-polymerase chain reaction assay), having AHRF (PaO2/FiO2 ratio <300), who were not able to maintain saturation above 90% on standard oxygen therapy were included in this study. On the HFNOT device, the initial flow rate and FiO2 were set at 60 L/minute and 100%, respectively. On the case record form (CRF), demographic characteristics, vital signs, laboratory tests, and arterial blood gas results were recorded. The ROX index (ratio of SpO2/FiO2 to respiratory rate) was calculated at 2 hours of HFNOT. Continuous variables were reported as mean or median values as appropriate. Intergroup differences were analyzed using Student's t-test or the Mann-Whitney U test. Intragroup differences between variables at different time points were analyzed using the paired Student's t-test.
p ≤ 0.05 was considered statistically significant. Statistical analysis was performed using SPSS software. Results: A total of 265 patients were screened, of whom 256 had AHRF. HFNOT was used as first-line therapy in 122 patients, of whom only 108 were found to be eligible for study inclusion. The mean age of the patients was 59.7 ± 15.1 years; male patients accounted for the majority (79.6%) of the HFNOT cohort. Key comorbidities were diabetes mellitus (48.1%) and hypertension (25.9%). The mean PaO2/FiO2 ratio at baseline was 96.8 ± 30.2, which significantly increased at 1 hour (114.8 ± 32.1, p < 0.001), at 6 hours (130.1 ± 36.5; p < 0.001), and at 7 days (178.7 ± 41.3; p < 0.001). The mean duration of HFNOT was 10.4 ± 4.9 days. Median (range) APACHE II and SOFA scores were 22 (12-35) and 8 (4-12), respectively. The HFNOT failure rate was 27.8%. NIV was used as ceiling respiratory support in 22.2% of the HFNOT cohort. The mean ROX index was significantly higher for patients who successfully continued on HFNOT compared to those who failed (3.4 ± 0.3 vs 2.8 ± 0.3; p < 0.001). Mean admission glucose, D-dimer, and IL-6 values were significantly higher in the HFNOT failure group compared to the HFNOT success group. Overall, the 28-day mortality rate in this cohort was 25.9%. About 50% of patients receiving HFNOT developed complications, of which epistaxis (18.5%) and air hunger (16.7%) were the most common. Discussion: This study prospectively highlights the significant impact of HFNOT on oxygenation status over the time points studied (i.e., at 1 hour, 6 hours, and 7 days). The baseline mean PaO2/FiO2 ratio was <100 (severe ARDS) when HFNOT was initiated. With such a low P/F ratio, HFNOT performed remarkably well, with a success rate of 72.2%. The significant improvement in P/F ratio may be explained by adequate flow delivery and FiO2 meeting the patients' demand. The main strengths of the study were its prospective nature and large cohort.
The main limitation is that, being a single-center study, the results need to be cautiously interpreted before extrapolating to patients in different geographical locations. About 50% of patients developed mild complications such as epistaxis, air hunger, and abdominal distension; however, one patient also developed a spontaneous tension pneumothorax which required immediate intercostal drain tube placement, following which the patient dramatically improved and survived to hospital discharge. We observed admission hyperglycemia, high D-dimer values, and high IL-6 levels in patients who failed HFNOT. These findings are consistent with other studies.3-5 Conclusion: HFNOT significantly improves oxygenation in COVID-19 patients developing acute hypoxemic respiratory failure. The HFNOT failure rate was 27.8%.
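The ROX index defined in this study's methods (SpO2/FiO2 divided by respiratory rate) is a one-line calculation. The patient values below are hypothetical examples, not data from the study:

```python
def rox_index(spo2_pct, fio2_fraction, resp_rate):
    # ROX index = (SpO2 / FiO2) / respiratory rate.
    # Higher values at 2 h of HFNOT favor continued success;
    # lower values flag patients at risk of HFNOT failure.
    return (spo2_pct / fio2_fraction) / resp_rate

# Hypothetical patient: SpO2 92%, FiO2 0.60, respiratory rate 28/min
print(round(rox_index(92, 0.60, 28), 2))  # → 5.48
```

Note that SpO2 is entered as a percentage and FiO2 as a fraction, the usual convention for this index; the cohort's low mean values (3.4 vs 2.8) reflect the very high FiO2 settings used.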

17.
Proc. - IEEE Int. Conf. Big Data, Big Data ; : 1380-1387, 2020.
Article in English | Scopus | ID: covidwho-1186068

ABSTRACT

The COVID-19 pandemic brought to the forefront an unprecedented need for experts, as well as citizens, to visualize spatio-temporal disease surveillance data. Web application dashboards were quickly developed to fill this gap, but all of these dashboards supported a particular niche view of the pandemic (i.e., current status or specific regions). In this paper, we describe our work developing our COVID-19 Surveillance Dashboard, which offers a unique view of the pandemic while also allowing users to focus on the details that interest them. From the beginning, our goal was to provide a simple visual tool for comparing, organizing, and tracking near-real-time surveillance data as the pandemic progresses. In developing this dashboard, we also identified 6 key metrics which we propose as a standard for the design and evaluation of real-time epidemic science dashboards. Our dashboard was one of the first released to the public, and continues to be actively visited. Our own group uses it to support federal, state and local public health authorities, and it is used by individuals worldwide to track the evolution of the COVID-19 pandemic, build their own dashboards, and support their organizations as they plan their responses to the pandemic. © 2020 IEEE.

18.
Indian Pediatrics ; 57(12):1177-1180, 2020.
Article in English | EMBASE | ID: covidwho-1064630

ABSTRACT

We conducted this online survey to assess the parental perspectives on remote learning, the associated stress, and school reopening during the COVID-19 pandemic. Of 2694 responses, 2032 (75.4%) parents perceived remote learning to be stressful for the child and 1902 (70.6%) for the family. The mean (SD) duration of remote learning was 3.2 (2.1) hours/day and 5.3 (1.0) days/week. Parents from 1637 (61.7%) families reported headaches and eye strain in children. Starting regular school was not acceptable to 1946 (72.2%) parents.
